
    Depth perception not found in human observers for static or dynamic anti-correlated random dot stereograms

    One of the greatest challenges in visual neuroscience is that of linking neural activity with perceptual experience. In the case of binocular depth perception, important insights have been achieved through comparing neural responses and the perception of depth, for carefully selected stimuli. One of the most important types of stimulus that has been used here is the anti-correlated random dot stereogram (ACRDS). In these stimuli, the contrast polarity of one half of a stereoscopic image is reversed. While neurons in cortical area V1 respond reliably to the binocular disparities in ACRDS, they do not create a sensation of depth. This discrepancy has been used to argue that depth perception must rely on neural activity elsewhere in the brain. Currently, the psychophysical results on which this argument rests are not clear-cut. While it is generally assumed that ACRDS do not support the perception of depth, some studies have reported that some people, some of the time, perceive depth in some types of these stimuli. Given the importance of these results for understanding the neural correlates of stereopsis, we studied depth perception in ACRDS using a large number of observers, in order to provide an unambiguous conclusion about the extent to which these stimuli support the perception of depth. We presented observers with random dot stereograms in which correlated dots were presented in a surrounding annulus and correlated or anti-correlated dots were presented in a central circular region. While observers could reliably report the depth of the central region for correlated stimuli, we found no evidence for depth perception in static or dynamic anti-correlated stimuli. Confidence ratings for stereoscopic perception were uniformly low for anti-correlated stimuli, but showed normal variation with disparity for correlated stimuli. These results establish that the inability of observers to perceive depth in ACRDS is a robust phenomenon
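
    To make the stimulus manipulation concrete, the sketch below shows one way a correlated or anti-correlated random dot stereogram pair could be generated with NumPy: dots in a central circular region carry a horizontal disparity, and for the anti-correlated case their contrast polarity is reversed in one eye's image. This is an illustrative simplification (the function name, dot density and image size are arbitrary choices), not the stimulus code used in the study.

    import numpy as np

    def make_rds(size=128, density=0.25, disparity=4, anticorrelated=False, seed=0):
        """Toy random dot stereogram: dots in a central circular region are
        shifted horizontally by `disparity` pixels in the right eye's image.
        For an anti-correlated stereogram (ACRDS), the contrast polarity of
        those dots is also reversed in that eye."""
        rng = np.random.default_rng(seed)
        # Dots are black (-1) or white (+1) on a grey (0) background.
        dots = np.where(rng.random((size, size)) < density,
                        rng.choice([-1.0, 1.0], size=(size, size)), 0.0)
        left = dots.copy()
        right = dots.copy()
        # Only the central circular region carries the disparity signal;
        # the surround stays correlated at zero disparity.
        yy, xx = np.mgrid[0:size, 0:size]
        centre = (yy - size / 2) ** 2 + (xx - size / 2) ** 2 < (size / 4) ** 2
        shifted = np.roll(dots, disparity, axis=1)
        right[centre] = -shifted[centre] if anticorrelated else shifted[centre]
        return left, right

    left, right = make_rds(anticorrelated=True)   # an ACRDS pair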

    Oceanographic Variability in Cumberland Bay, South Georgia, and Its Implications for Glacier Retreat

    South Georgia is a heavily glaciated sub-Antarctic island in the Southern Ocean. Cumberland Bay is the largest fjord on the island, split into two arms, each with a large marine-terminating glacier at the head. Although these glaciers have shown markedly different retreat rates over the past century, the underlying drivers of such differential retreat are not yet understood. This study uses observations and a new high-resolution oceanographic model to characterize oceanographic variability in Cumberland Bay and to explore its influence on glacier retreat. While observations indicate a strong seasonal cycle in temperature and salinity, they reveal no clear hydrographic differences that could explain the differential glacier retreat. Model simulations suggest that subglacial outflow plume dynamics and fjord circulation are sensitive to the bathymetry adjacent to the glacier, though this does not provide a persuasive explanation for the asymmetric glacier retreat. The addition of a postulated shallow inner sill in one fjord arm, however, significantly changes the water properties in the resultant inner basin by blocking the intrusion of colder, higher-salinity waters at depth. This increase in temperature could significantly increase submarine melting, which is proposed as a possible contributor to the different rates of glacier retreat observed in the two fjord arms. This study represents the first detailed description of the oceanographic variability of a sub-Antarctic island fjord, highlighting the sensitivity of fjord oceanography to bathymetry. Notably, in fjord systems where temperature decreases with depth, the presence of a shallow sill has the potential to accelerate glacier retreat.
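
    The sill mechanism described above can be illustrated with a toy calculation: if temperature decreases with depth, water colder than that found at the sill crest cannot spill into the inner basin, so the basin's deep water is warmer than it would be without the sill. The depth range, temperature profile and sill depth below are assumed values chosen purely for illustration, not results from the study.

    import numpy as np

    # Toy illustration of sill blocking: with temperature decreasing with
    # depth, water below the sill crest cannot flow into the inner basin,
    # so the basin is renewed by the warmer water found at sill depth.
    depth = np.arange(0.0, 300.0, 10.0)       # m, hypothetical fjord depths
    temperature = 2.0 - 0.005 * depth         # degC, assumed linear decrease

    sill_depth = 50.0                          # m, postulated shallow inner sill
    no_sill_deep_t = temperature[depth >= sill_depth].mean()
    with_sill_deep_t = np.interp(sill_depth, depth, temperature)

    print(f"Mean deep-water temperature without sill: {no_sill_deep_t:.2f} degC")
    print(f"Deep-water temperature with sill:         {with_sill_deep_t:.2f} degC")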

    Transitions in modes of coastal adaptation: addressing blight, engagement and sustainability

    Coastal defences have long provided protection from erosion and flooding to cities, towns and villages. In many parts of the world, continued defence is being questioned due to environmental, sustainability and economic considerations. This is exemplified in England and Wales, where strategic Shoreline Management Plans envisage realignment of many protected coasts, often with low population densities, over the coming decades. The policy transition from protection to realignment is often resisted by affected communities and can have high political costs. Whilst some preparations for such transitions have been made, the communities affected are often not fully aware of the implications of policy change, and this brings the potential for blight. In this paper, we investigate the challenges of implementing transitions in coastal policy within England and Wales. The analysis is based on data obtained from three workshops held in 2019 that were attended by council members, engineers, planners, scientists and other relevant professionals. Five conditions are found to promote contention: (i) policy actors with competing priorities and different decision-making time frames (immediate to decadal to a century); (ii) divergence between regulations and ad hoc political decisions (e.g. in relation to the demand for new housing); (iii) limited or non-existent funding to support policy transition; (iv) community expectation that protection is forever; and (v) a disconnection between people and ongoing coastal change. Our research indicates that transitions can be better supported through: (1) integrated multi-scalar preparedness for coastal change; (2) an accessible evidence base and future vision to nurture political confidence in adaptation; and (3) defined, time-bound and accessible diverse funding streams to achieve transitions. Critically, these generic actions need to be embedded within the local political and planning system to facilitate the transition to more sustainable coasts and their communities.

    Global patient outcomes after elective surgery: prospective cohort study in 27 low-, middle- and high-income countries.

    BACKGROUND: As global initiatives increase patient access to surgical treatments, there remains a need to understand the adverse effects of surgery and define appropriate levels of perioperative care. METHODS: We designed a prospective international 7-day cohort study of outcomes following elective adult inpatient surgery in 27 countries. The primary outcome was in-hospital complications. Secondary outcomes were death following a complication (failure to rescue) and death in hospital. Process measures were admission to critical care immediately after surgery or to treat a complication and duration of hospital stay. A single definition of critical care was used for all countries. RESULTS: A total of 474 hospitals in 19 high-, 7 middle- and 1 low-income country were included in the primary analysis. Data included 44 814 patients with a median hospital stay of 4 (range 2-7) days. A total of 7508 patients (16.8%) developed one or more postoperative complication and 207 died (0.5%). The overall mortality among patients who developed complications was 2.8%. Mortality following complications ranged from 2.4% for pulmonary embolism to 43.9% for cardiac arrest. A total of 4360 (9.7%) patients were admitted to a critical care unit as routine immediately after surgery, of whom 2198 (50.4%) developed a complication, with 105 (2.4%) deaths. A total of 1233 patients (16.4%) were admitted to a critical care unit to treat complications, with 119 (9.7%) deaths. Despite lower baseline risk, outcomes were similar in low- and middle-income compared with high-income countries. CONCLUSIONS: Poor patient outcomes are common after inpatient surgery. Global initiatives to increase access to surgical treatments should also address the need for safe perioperative care. STUDY REGISTRATION: ISRCTN5181700
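
    The headline percentages quoted above can be reproduced directly from the reported counts; the short calculation below uses only numbers given in the abstract, with failure to rescue taken as deaths among patients who developed a complication.

    # Quick check of the headline figures quoted in the abstract.
    patients = 44814
    complications = 7508
    deaths = 207

    complication_rate = complications / patients      # ~16.8%
    in_hospital_mortality = deaths / patients         # ~0.5%
    # Failure to rescue: mortality among patients with a complication.
    failure_to_rescue = deaths / complications        # ~2.8%

    print(f"Complication rate:  {complication_rate:.1%}")
    print(f"In-hospital deaths: {in_hospital_mortality:.1%}")
    print(f"Failure to rescue:  {failure_to_rescue:.1%}")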

    Genetic mechanisms of critical illness in COVID-19.

    Host-mediated lung inflammation is present [1], and drives mortality [2], in the critical illness caused by coronavirus disease 2019 (COVID-19). Host genetic variants associated with critical illness may identify mechanistic targets for therapeutic development [3]. Here we report the results of the GenOMICC (Genetics Of Mortality In Critical Care) genome-wide association study in 2,244 critically ill patients with COVID-19 from 208 UK intensive care units. We have identified and replicated the following new genome-wide significant associations: on chromosome 12q24.13 (rs10735079, P = 1.65 × 10^-8) in a gene cluster that encodes antiviral restriction enzyme activators (OAS1, OAS2 and OAS3); on chromosome 19p13.2 (rs74956615, P = 2.3 × 10^-8) near the gene that encodes tyrosine kinase 2 (TYK2); on chromosome 19p13.3 (rs2109069, P = 3.98 × 10^-12) within the gene that encodes dipeptidyl peptidase 9 (DPP9); and on chromosome 21q22.1 (rs2236757, P = 4.99 × 10^-8) in the interferon receptor gene IFNAR2. We identified potential targets for repurposing of licensed medications: using Mendelian randomization, we found evidence that low expression of IFNAR2, or high expression of TYK2, are associated with life-threatening disease; and transcriptome-wide association in lung tissue revealed that high expression of the monocyte-macrophage chemotactic receptor CCR2 is associated with severe COVID-19. Our results identify robust genetic signals relating to key host antiviral defence mechanisms and mediators of inflammatory organ damage in COVID-19. Both mechanisms may be amenable to targeted treatment with existing drugs. However, large-scale randomized clinical trials will be essential before any change to clinical practice.
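
    As a simple illustration, the reported lead variants can be checked against the conventional genome-wide significance threshold of P < 5 × 10^-8; the threshold is a standard convention in GWAS, not something stated in the abstract itself.

    # Reported lead variants and P-values from the abstract, checked against
    # the conventional genome-wide significance threshold of 5e-8.
    GENOME_WIDE_THRESHOLD = 5e-8

    associations = {
        "rs10735079 (12q24.13, OAS1/OAS2/OAS3)": 1.65e-8,
        "rs74956615 (19p13.2, TYK2)": 2.3e-8,
        "rs2109069 (19p13.3, DPP9)": 3.98e-12,
        "rs2236757 (21q22.1, IFNAR2)": 4.99e-8,
    }

    for variant, p in associations.items():
        flag = "genome-wide significant" if p < GENOME_WIDE_THRESHOLD else "not significant"
        print(f"{variant}: P = {p:.3g} -> {flag}")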

    Effectiveness of a national quality improvement programme to improve survival after emergency abdominal surgery (EPOCH): a stepped-wedge cluster-randomised trial

    BACKGROUND: Emergency abdominal surgery is associated with poor patient outcomes. We studied the effectiveness of a national quality improvement (QI) programme to implement a care pathway to improve survival for these patients. METHODS: We did a stepped-wedge cluster-randomised trial of patients aged 40 years or older undergoing emergency open major abdominal surgery. Eligible UK National Health Service (NHS) hospitals (those that had an emergency general surgical service, a substantial volume of emergency abdominal surgery cases, and contributed data to the National Emergency Laparotomy Audit) were organised into 15 geographical clusters and commenced the QI programme in a random order, based on a computer-generated random sequence, over an 85-week period, with one geographical cluster commencing the intervention every 5 weeks from the second to the 16th time period. Patients were masked to the study group, but it was not possible to mask hospital staff or investigators. The primary outcome measure was mortality within 90 days of surgery. Analyses were done on an intention-to-treat basis. This study is registered with the ISRCTN registry, number ISRCTN80682973. FINDINGS: Treatment took place between March 3, 2014, and Oct 19, 2015. 22 754 patients were assessed for eligibility. Of 15 873 eligible patients from 93 NHS hospitals, primary outcome data were analysed for 8482 patients in the usual care group and 7374 in the QI group. Eight patients in the usual care group and nine patients in the QI group were not included in the analysis because of missing primary outcome data. The primary outcome of 90-day mortality occurred in 1210 (16%) patients in the QI group compared with 1393 (16%) patients in the usual care group (HR 1·11, 0·96-1·28). INTERPRETATION: No survival benefit was observed from this QI programme to implement a care pathway for patients undergoing emergency abdominal surgery. Future QI programmes should ensure that teams have both the time and resources needed to improve patient care. FUNDING: National Institute for Health Research Health Services and Delivery Research Programme.
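
    The allocation design described above can be sketched in a few lines: 15 clusters cross from usual care to the QI intervention in a random order, one per 5-week period, from the 2nd to the 16th period. The seed and variable names below are arbitrary; this is an illustrative layout of a stepped-wedge schedule, not the trial's randomisation code.

    import numpy as np

    # Minimal sketch of the stepped-wedge allocation described in the abstract:
    # 15 geographical clusters cross from usual care (0) to the QI intervention
    # (1) in a random order, one cluster per 5-week period, periods 2 to 16.
    rng = np.random.default_rng(2014)               # seed chosen arbitrarily here
    n_clusters, n_periods = 15, 16
    crossover_order = rng.permutation(n_clusters)   # computer-generated random sequence

    schedule = np.zeros((n_clusters, n_periods), dtype=int)
    for step, cluster in enumerate(crossover_order, start=1):
        schedule[cluster, step:] = 1                # intervention from period step+1 onwards

    # Each row is one cluster; columns are the 16 five-week periods.
    print(schedule)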

    Factors Affecting the Retention of Indigenous Australians in the Health Workforce: A Systematic Review

    Indigenous Australians are under-represented in the health workforce. The shortfall in the Indigenous health workforce compounds the health disparities experienced by Indigenous Australians and places pressure on Indigenous health professionals. This systematic review aims to identify enablers and barriers to the retention of Indigenous Australians within the health workforce and to describe strategies to assist with development and retention of Indigenous health professionals after qualification. Four electronic databases were systematically searched in August 2017. Supplementary searches of relevant websites were also undertaken. Articles were screened for inclusion using pre-defined criteria and assessed for quality using the Mixed Methods Assessment Tool. Fifteen articles met the criteria for inclusion. Important factors affecting the retention of Indigenous health professionals included work environment, heavy workloads, poorly documented/understood roles and responsibilities, low salary and a perception of salary disparity, and the influence of community as both a strong personal motivator and source of stress when work/life boundaries could not be maintained. Evidence suggests that retention of Indigenous health professionals will be improved through building supportive and culturally safe workplaces; clearly documenting and communicating roles, scope of practice and responsibilities; and ensuring that employees are appropriately supported and remunerated. The absence of intervention studies highlights the need for deliberative interventions that rigorously evaluate all aspects of implementation of relevant workforce, health service policy, and practice change

    Mean number of far responses (out of 20) as a function of disparity.

    Negative values represent crossed disparities, positive values uncrossed disparities. Filled symbols show the results for CRDS (•), unfilled symbols for ACRDS (○). Results for the four presentation times are plotted separately ((a) 80 ms; (b) 120 ms; (c) 200 ms; (d) 400 ms). In each case, the dashed line shows chance performance, and error bars show ±1 standard error of the mean.

    Binocular energy model responses to CRDS and ACRDS.

    (a) Mean responses of a population of position-tuned neurons to CRDS. Results are plotted as a function of the preferred disparity of each neuron, and show the normalised mean response across 100 stimuli, all with a disparity of 12 pixels. The solid black line shows the response of model neurons tuned to a low frequency; the dotted red line the responses of model neurons tuned to a high frequency. The solid vertical line marks zero disparity and the dotted vertical lines show ± the magnitude of the stimulus disparity. The peak response occurs for neurons tuned to the correct disparity for both frequencies. (b) Shows the responses in the same way for ACRDS. In this case, the peaks occur at different disparities (with different signs) for the two frequencies. (c) Responses to CRDS averaged across four frequencies show a clear peak at the correct disparity. (d) For ACRDS, the responses averaged across frequency show a clear minimum at the correct disparity, but no pronounced peak.
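
    For readers unfamiliar with the model behind this figure, the sketch below implements a minimal one-dimensional position-shift binocular energy unit: the left and right receptive fields are identical Gabors offset by the unit's preferred disparity, and the binocular energy is the sum of squared responses over a quadrature pair. The parameter values, the 1-D simplification and the toy stimuli are assumptions made for illustration; this is not the simulation code used to produce the figure.

    import numpy as np

    def gabor(x, freq, phase, sigma=8.0):
        """1-D Gabor receptive field."""
        return np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * x + phase)

    def binocular_energy(left, right, preferred_disparity, freq, sigma=8.0):
        """Position-shift binocular energy unit: left and right receptive fields
        are identical Gabors offset by the preferred disparity; the energy is
        summed over a quadrature (0 / 90 degree phase) pair."""
        x = np.arange(len(left)) - len(left) / 2
        energy = 0.0
        for phase in (0.0, np.pi / 2):
            rf_l = gabor(x, freq, phase, sigma)
            rf_r = gabor(x - preferred_disparity, freq, phase, sigma)
            energy += (rf_l @ left + rf_r @ right) ** 2
        return energy

    # Toy 1-D stereo pair with a 12-sample disparity; anti-correlation flips
    # the contrast of the right image, as in an ACRDS.
    rng = np.random.default_rng(1)
    left = rng.standard_normal(256)
    right_crds = np.roll(left, 12)
    right_acrds = -right_crds

    for label, right in (("CRDS", right_crds), ("ACRDS", right_acrds)):
        responses = [binocular_energy(left, right, d, freq=0.05) for d in range(-24, 25)]
        best = range(-24, 25)[int(np.argmax(responses))]
        print(f"{label}: peak response at preferred disparity {best}")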